On the Maximum Partial Sum of Independent Random Variables

Authors

Abstract


Similar References

On the Maximum Partial Sum of Independent Random Variables.

This becomes false if (BI), (Be) and (B) are replaced by (Nt), (NM) and (N), respectively. This follows, even for p = 2 = q, from the above example proving that (NW) is not linear. Correspondingly, (Ne) cannot be interpreted as the dual space of (NP), since such an interpretation would involve the definition of a scalar product. 7. Let (Nt) denote the space which relates to the space (Nt) in th...


Maximizing the Entropy of a Sum of Independent Random Variables

Let X1, ..., Xn be n independent, symmetric random variables supported on the interval [-1, 1] and let Sn = X1 + ... + Xn be their sum. We show that the differential entropy of Sn is maximized when X1, ..., X(n-1) are Bernoulli, taking the values +1 and -1 with equal probability, and Xn is uniformly distributed. This entropy maximization problem is due to Shlomo Shamai [1], who also conjectured the solution.
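As a small worked check of this claim for n = 2 (an illustration from standard closed forms, not taken from the paper): with X1 Rademacher (+1/-1 equiprobable) and X2 uniform on [-1, 1], the sum is uniform on [-2, 2], whereas the sum of two uniforms has a triangular density with strictly smaller differential entropy.

```python
import math

# Differential entropy (in nats) of S2 = X1 + X2 for two candidate choices.
# Illustrative check only; both values follow from standard closed forms.

# Case A: X1 Rademacher (+1/-1 equiprobable), X2 ~ Uniform[-1, 1].
# Then S2 is uniform on [-2, 2] (density 1/4), so h(S2) = ln 4.
h_bernoulli_plus_uniform = math.log(4)

# Case B: X1, X2 ~ Uniform[-1, 1].  S2 has the triangular density
# f(x) = (2 - |x|)/4 on [-2, 2], whose entropy is 1/2 + ln 2.
h_two_uniforms = 0.5 + math.log(2)

print(f"Rademacher + uniform: {h_bernoulli_plus_uniform:.4f} nats")
print(f"uniform + uniform:    {h_two_uniforms:.4f} nats")
```

The gap (about 1.386 vs 1.193 nats) is consistent with the abstract's statement that mixing one uniform with Bernoulli components beats the all-uniform choice.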


Generating the Maximum of Independent Identically Distributed Random Variables

Frequently the need arises for the computer generation of variates that are exactly distributed as Z = max(X1, ..., Xn), where X1, ..., Xn form a sequence of independent identically distributed random variables. For large n the individual generation of the Xi's is infeasible, and the inversion-of-a-beta-variate approach is potentially inaccurate. In this paper, we discuss and compare the corrected inve...
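The standard identity behind such one-shot generators (a minimal sketch of the textbook inversion method, not the paper's corrected algorithm): if the Xi have CDF F, then Z = max(X1, ..., Xn) has CDF F(z)^n, so a single uniform variate U yields Z = F^{-1}(U^(1/n)). For Uniform(0, 1) inputs this is simply U^(1/n).

```python
import random

def max_of_iid_uniform(n: int, rng: random.Random) -> float:
    """Generate Z = max(X1, ..., Xn) for Xi ~ Uniform(0, 1) in O(1) time.

    The max of n iid Uniform(0, 1) variates has CDF F(z) = z**n, so
    inverting F at a single uniform U gives Z = U**(1/n) exactly in
    distribution.  This is the plain, uncorrected inversion that the
    abstract warns can lose floating-point accuracy for very large n.
    """
    u = rng.random()
    return u ** (1.0 / n)

rng = random.Random(42)
samples = [max_of_iid_uniform(10, rng) for _ in range(100_000)]
# Theory: E[Z] = n / (n + 1) = 10/11 for n = 10.
mean = sum(samples) / len(samples)
print(f"sample mean {mean:.4f}, theory {10/11:.4f}")
```

For large n, computing `math.exp(math.log(u) / n)` is the numerically safer form of the same inversion, which is the accuracy issue the paper addresses.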


On the maximum entropy of the sum of two dependent random variables

If p_i (i = 1, ..., N) is the probability of the ith letter of a memoryless source, the length l_i of the corresponding binary Huffman codeword can be very different from the value -log p_i. For a typical letter, however, l_i ≈ -log p_i. More precisely, P_m = Σ_{j ∈ {i : l_i < -log p_i - m}} p_j < 2^{-m} and P'_m = Σ_{j ∈ {i : l_i > -log p_i + m}} p_j < 2^{-c(m-1)}, where c ≈ 2.27. Index Terms - Huffman c...
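The typical-case claim l_i ≈ -log p_i can be seen concretely (a small illustration, not this paper's method): for a dyadic source, where every p_i is a power of 1/2, the Huffman codeword lengths equal -log2 p_i exactly.

```python
import heapq
import math

def huffman_lengths(probs):
    """Return binary Huffman codeword lengths for the distribution `probs`."""
    # Heap entries: (subtree probability, tiebreak counter, symbols below).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Each merge adds one bit to every codeword in the merged subtrees.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]   # dyadic source
lengths = huffman_lengths(probs)
for p, l in zip(probs, lengths):
    print(f"p={p:<6} l={l}  -log2 p={-math.log2(p):.0f}")
```

For non-dyadic sources the lengths deviate from -log2 p_i, and the quoted bounds control how much total probability can sit on letters with large deviations.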


On the Decrease Rate of the Non-Gaussianness of the Sum of Independent Random Variables

Several proofs of the monotonicity of the non-Gaussianness (the divergence with respect to a Gaussian random variable with identical second-order statistics) of the sum of n independent and identically distributed (i.i.d.) random variables have been published. We give an upper bound on the decrease rate of the non-Gaussianness that is proportional to the inverse of n, for large n. The proof is based on ...



Journal

Journal title: Proceedings of the National Academy of Sciences

Year: 1947

ISSN: 0027-8424, 1091-6490

DOI: 10.1073/pnas.33.5.132